Iteration Complexity of Feasible Descent Methods for Convex Optimization
Authors
Abstract
In many machine learning problems, such as the dual form of SVM, the objective function to be minimized is convex but not strongly convex. This fact causes difficulties in obtaining the complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to some non-strongly convex problems. In particular, we are the first to prove O(log(1/ε)) time complexity of cyclic coordinate descent methods on dual problems of support vector classification and regression.
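To make the setting concrete, the following is a minimal sketch of cyclic coordinate descent on the dual of the L1-loss linear SVM, the kind of method for which the paper proves an O(log(1/ε)) bound. Function and parameter names are illustrative, not from the paper; the update itself is the standard one-variable exact minimization clipped to the box constraint [0, C].

```python
import numpy as np

def svm_dual_cd(X, y, C=1.0, epochs=20):
    """Cyclic coordinate descent on the SVM dual
        min_a  0.5 a^T Q a - e^T a   s.t.  0 <= a_i <= C,
    with Q_ij = y_i y_j x_i^T x_j.  Maintaining w = sum_i a_i y_i x_i
    makes each coordinate update cost O(d)."""
    n, d = X.shape
    a = np.zeros(n)
    w = np.zeros(d)
    Qii = (X ** 2).sum(axis=1)                # diagonal of Q, since y_i^2 = 1
    for _ in range(epochs):
        for i in range(n):                    # one cyclic pass over coordinates
            if Qii[i] == 0.0:
                continue
            g = y[i] * w.dot(X[i]) - 1.0      # partial gradient grad_i f(a)
            a_new = min(max(a[i] - g / Qii[i], 0.0), C)   # exact 1-D minimize, clipped
            w += (a_new - a[i]) * y[i] * X[i] # keep w consistent with a
            a[i] = a_new
    return w, a
```

Each cyclic pass costs O(nd); the paper's point is that the number of passes needed to reach an ε-accurate solution grows only like log(1/ε), even though the dual objective is not strongly convex.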
Similar Papers
A family of subgradient-based methods for convex optimization problems in a unifying framework
We propose a new family of subgradient- and gradient-based methods that converge with optimal complexity for convex optimization problems whose feasible region is simple enough. This includes cases where the objective function is non-smooth, smooth, has composite/saddle structure, or is given by an inexact oracle model. We unify the way of constructing the subproblems which are necessary to...
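As a rough illustration of the setting (not the paper's unified family, whose construction the abstract only outlines), here is a basic projected subgradient method for a non-smooth objective over a simple feasible region; the example objective and projection are hypothetical.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, steps=500):
    """Projected subgradient method for min f(x) over a simple set Q:
        x_{k+1} = P_Q(x_k - t_k g_k),  g_k in the subdifferential of f at x_k,
    with diminishing steps t_k = 1/sqrt(k+1).  Returns the running average
    of iterates, which attains the optimal O(1/sqrt(k)) rate for non-smooth
    convex f."""
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for k in range(steps):
        x = project(x - subgrad(x) / np.sqrt(k + 1.0))
        avg += (x - avg) / (k + 1)            # incremental average of iterates
    return avg

# Hypothetical example: minimize ||x||_1 over the box [-1, 2]^5
x_avg = projected_subgradient(
    subgrad=lambda x: np.sign(x),             # a valid subgradient of ||x||_1
    project=lambda x: np.clip(x, -1.0, 2.0),  # cheap projection onto the box
    x0=np.full(5, 2.0),
)
```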
Efficient parallel coordinate descent algorithm for convex optimization problems with separable constraints: application to distributed MPC
In this paper we propose a parallel coordinate descent algorithm for solving smooth convex optimization problems with separable constraints that may arise, e.g., in distributed model predictive control (MPC) for linear network systems. Our algorithm is based on block coordinate descent updates performed in parallel and has a very simple iteration. We prove a (sub)linear rate of convergence for the new algori...
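The paper's algorithm assigns each block its own Lipschitz constant and updates all blocks concurrently; the sketch below is a deliberately simplified version for a box-constrained QP (scalar blocks, one conservative global step 1/L), in which the fully parallel Jacobi update reduces to projected gradient. All names and the problem instance are illustrative.

```python
import numpy as np

def parallel_cd_box_qp(Q, q, lo, hi, steps=300):
    """Parallel (Jacobi-style) coordinate descent sketch for
        min 0.5 x^T Q x + q^T x   s.t.  lo <= x <= hi,
    a smooth convex problem whose box constraints are separable, so every
    coordinate can be updated independently.  With the conservative global
    step 1/L (L = largest eigenvalue of Q) the simultaneous update is a
    guaranteed descent step."""
    n = Q.shape[0]
    L = np.linalg.eigvalsh(Q)[-1]             # Lipschitz constant of the gradient
    x = np.clip(np.zeros(n), lo, hi)
    for _ in range(steps):
        g = Q @ x + q                         # full gradient; entries independent
        x = np.clip(x - g / L, lo, hi)        # all coordinates updated at once
    return x

# Illustrative 2-D instance
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
x = parallel_cd_box_qp(Q, q=np.array([-1.0, -1.0]), lo=0.0, hi=1.0)
```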
On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Cyclic block coordinate descent-type (CBCD-type) methods, which perform iterative updates for a few coordinates (a block) simultaneously throughout the procedure, have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized log...
An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD...